@kah-seng Thanks for the draft, a couple of quick answers to your questions:
I’ll help review the code and get this feature shipped as soon as possible. |
Pull request overview
This PR introduces “Bring Your Own Key (BYOK)” support so users can configure their own OpenAI-compatible endpoints/API keys via settings, and have those custom models appear in model selection and be used by the backend when creating chat completions.
Changes:
- Add `CustomModel` to user settings (proto + backend model + mapper) and surface custom models via `ListSupportedModelsV2`.
- Update the webapp settings UI to CRUD custom models, and update the model selector to label them as custom.
- Update the AI client/config plumbing to route requests through user-provided endpoint/API key when a custom model is selected.
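To make the routing change concrete, here is a minimal sketch of how the selection logic could work: requests go through the user-provided endpoint/API key when the selected slug matches one of their custom models. Type and field names here are assumptions based on the PR summary, not the actual definitions in the codebase.

```go
// Hypothetical shapes for a user's custom model and the provider config;
// field names are illustrative, not taken from the PR.
package main

import "fmt"

type CustomModel struct {
	Name     string // display name shown in the model selector
	Slug     string // identifier used when selecting the model
	Endpoint string // OpenAI-compatible base URL
	APIKey   string // user-provided API key
}

type LLMProviderConfig struct {
	Endpoint      string
	APIKey        string
	IsCustomModel bool
}

// selectProviderConfig routes through the user's endpoint and key when the
// chosen slug matches a custom model, falling back to the default config.
func selectProviderConfig(slug string, custom []CustomModel, def LLMProviderConfig) LLMProviderConfig {
	for _, m := range custom {
		if m.Slug == slug {
			return LLMProviderConfig{Endpoint: m.Endpoint, APIKey: m.APIKey, IsCustomModel: true}
		}
	}
	return def
}

func main() {
	custom := []CustomModel{{Name: "My LLM", Slug: "my-llm", Endpoint: "https://llm.example.com/v1", APIKey: "sk-user"}}
	cfg := selectProviderConfig("my-llm", custom, LLMProviderConfig{Endpoint: "https://default.example.com"})
	fmt.Println(cfg.IsCustomModel, cfg.Endpoint) // true https://llm.example.com/v1
}
```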
Reviewed changes
Copilot reviewed 18 out of 18 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| webapp/_webapp/src/views/settings/sections/api-key-settings.tsx | New BYOK settings modal + custom model CRUD UI wired into settings store |
| webapp/_webapp/src/views/chat/footer/toolbar/model-selection.tsx | Appends “(Custom)” to custom model subtitles |
| webapp/_webapp/src/pkg/gen/apiclient/user/v1/user_pb.ts | Regenerated TS API types to include CustomModel + Settings.customModels |
| webapp/_webapp/src/pkg/gen/apiclient/chat/v2/chat_pb.ts | Regenerated TS API types to include SupportedModel.isCustom |
| webapp/_webapp/src/hooks/useLanguageModels.ts | Adds isCustom field and filters supported models list |
| proto/user/v1/user.proto | Adds CustomModel and Settings.custom_models |
| proto/chat/v2/chat.proto | Adds SupportedModel.is_custom |
| pkg/gen/api/user/v1/user.pb.go | Regenerated Go API for new user settings fields |
| pkg/gen/api/chat/v2/chat.pb.go | Regenerated Go API for SupportedModel.is_custom |
| internal/models/user.go | Adds backend CustomModel and stores it on user Settings |
| internal/api/mapper/user.go | Maps custom_models between proto and DB model |
| internal/models/llm_provider.go | Adds IsCustomModel to provider config |
| internal/api/chat/list_supported_models_v2.go | Returns user custom models as supported models |
| internal/api/chat/create_conversation_message_stream_v2.go | Selects provider config based on chosen custom model |
| internal/services/toolkit/client/client_v2.go | Uses user-provided endpoint/key for custom models |
| internal/services/toolkit/client/utils_v2.go | Adjusts default params based on custom vs non-custom |
| internal/services/toolkit/client/completion_v2.go | Passes IsCustomModel through to default param selection |
| internal/services/toolkit/client/get_conversation_title_v2.go | Uses conversation model for title generation when custom model is active |
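Based on the file summaries above, `list_supported_models_v2.go` appends the user's custom models (flagged `is_custom`) to the built-in list. A hedged sketch of that shape, with illustrative type names that are assumptions rather than the PR's actual code:

```go
// Illustrative sketch: surface a user's custom models alongside the
// built-in supported models, marked with IsCustom for the UI to label.
package main

import "fmt"

type SupportedModel struct {
	Slug     string
	Name     string
	IsCustom bool // mirrors the new SupportedModel.is_custom proto field
}

type CustomModel struct {
	Slug string
	Name string
}

// listSupportedModels copies the built-in models and appends the user's
// custom models with IsCustom set.
func listSupportedModels(builtin []SupportedModel, custom []CustomModel) []SupportedModel {
	out := append([]SupportedModel{}, builtin...)
	for _, m := range custom {
		out = append(out, SupportedModel{Slug: m.Slug, Name: m.Name, IsCustom: true})
	}
	return out
}

func main() {
	models := listSupportedModels(
		[]SupportedModel{{Slug: "gpt-4o", Name: "GPT-4o"}},
		[]CustomModel{{Slug: "my-llm", Name: "My LLM"}},
	)
	for _, m := range models {
		fmt.Println(m.Slug, m.IsCustom)
	}
}
```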
Comments suppressed due to low confidence (1)
internal/services/toolkit/client/client_v2.go:67
- Allowing users to set an arbitrary `Endpoint` that is then passed to `option.WithBaseURL` introduces a server-side SSRF vector (the backend will make HTTP requests to any URL the user provides). Add server-side validation/allowlisting for custom endpoints (e.g., require https, block localhost/private IP ranges, and/or restrict to known domains), ideally when persisting settings and before creating the client.
```go
func (a *AIClientV2) GetOpenAIClient(llmConfig *models.LLMProviderConfig) *openai.Client {
	var Endpoint string = llmConfig.Endpoint
	var APIKey string = llmConfig.APIKey
	if !llmConfig.IsCustomModel {
		if Endpoint == "" {
			if APIKey != "" {
				// User provided their own API key, use the OpenAI-compatible endpoint
				Endpoint = a.cfg.OpenAIBaseURL // standard openai base url
			} else {
				// suffix needed for cloudflare gateway
				Endpoint = a.cfg.InferenceBaseURL + "/openrouter"
			}
		}
		if APIKey == "" {
			APIKey = a.cfg.InferenceAPIKey
		}
	}
	opts := []option.RequestOption{
		option.WithAPIKey(APIKey),
		option.WithBaseURL(Endpoint),
	}
```
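The validation suggested in the comment above could be sketched as follows. This is a minimal example, assuming validation runs when settings are persisted and again before the client is created; the function name is illustrative, not from the PR.

```go
// ValidateCustomEndpoint is a hypothetical server-side check that rejects
// endpoints usable for SSRF: non-https schemes, localhost, and hosts that
// resolve to loopback, private, or link-local addresses.
package main

import (
	"fmt"
	"net"
	"net/url"
)

func ValidateCustomEndpoint(raw string) error {
	u, err := url.Parse(raw)
	if err != nil {
		return fmt.Errorf("invalid endpoint URL: %w", err)
	}
	if u.Scheme != "https" {
		return fmt.Errorf("endpoint must use https, got %q", u.Scheme)
	}
	host := u.Hostname()
	if host == "localhost" {
		return fmt.Errorf("localhost endpoints are not allowed")
	}
	// Resolve the host and reject loopback/private/link-local addresses.
	// Note: DNS rebinding is still possible; a stricter design would pin
	// the resolved IP for the actual request as well.
	ips, err := net.LookupIP(host)
	if err != nil {
		return fmt.Errorf("cannot resolve endpoint host %q: %w", host, err)
	}
	for _, ip := range ips {
		if ip.IsLoopback() || ip.IsPrivate() || ip.IsLinkLocalUnicast() {
			return fmt.Errorf("endpoint resolves to a disallowed address: %s", ip)
		}
	}
	return nil
}

func main() {
	fmt.Println(ValidateCustomEndpoint("http://llm.example.com/v1")) // rejected: not https
	fmt.Println(ValidateCustomEndpoint("https://localhost/v1"))      // rejected: localhost
}
```

An allowlist of known provider domains would be stricter still; the right trade-off depends on how open the BYOK feature is meant to be.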
4ndrelim
left a comment
lgtm. Just need to resolve merge conflicts and some of Copilot's comments.
Some parts I may be glossing over and am not too sure about yet, but I will approve for merging into staging and let's test there. I will do a final review before merging to main.
This PR aims to let users use their own OpenAI-compatible endpoints and API keys (including providers other than OpenAI). The implementation is not complete yet, but I am creating this draft PR to get early feedback.
Settings
Replaces the previous OpenAI key input in settings. Users can specify a name, slug, base URL, and API key for each custom model.

Model Selection
Models with user-specified API keys have a "(Custom)" appended to the slug.

Questions
Todo